43 research outputs found

    Optimization of computer-assisted intraoperative guidance for complex oncological procedures

    International Mention in the doctoral degree.

    The role of technology inside the operating room is constantly increasing, enabling surgical procedures previously considered impossible or too risky due to their complexity or limited access. These reliable tools have improved surgical efficiency and safety. Cancer treatment is one of the surgical specialties that has benefited most from these techniques, owing to its high incidence and the accuracy required for tumor resections with conservative approaches and clear margins. However, in many cases, introducing these technologies into surgical scenarios is expensive and entails complex setups that are obtrusive, invasive, and increase operative time. In this thesis, we propose convenient, accessible, reliable, and non-invasive solutions for two highly complex regions for tumor resection surgeries: the pelvis and the head and neck. We explored how introducing 3D printing, surgical navigation, and augmented reality into these scenarios provides high intraoperative precision.

    First, we presented a less invasive setup for osteotomy guidance in pelvic tumor resections based on small patient-specific instruments (PSIs) fabricated with a desktop 3D printer at low cost. We evaluated their accuracy in a cadaveric study following a realistic workflow and obtained results similar to those of previous studies with more invasive setups. We also identified the ilium as the region most prone to errors. We then proposed surgical navigation using these small PSIs for image-to-patient registration. Artificial landmarks included in the PSIs substitute the anatomical landmarks and bone surface commonly used for this step, which require additional bone exposure and are therefore more invasive. We also presented an alternative, more convenient installation of the dynamic reference frame used to track patient movements during surgical navigation: the reference frame is inserted into a socket included in the PSIs and can be attached and detached without losing precision, simplifying the installation. We validated the setup in a cadaveric study, evaluating the accuracy and finding the optimal PSI configuration in the three most common scenarios for pelvic tumor resection. The results demonstrated high accuracy; the main source of error was again incorrect placement of PSIs in regular, homogeneous regions such as the ilium.

    The main limitation of PSIs is the guidance error resulting from incorrect placement. To overcome this issue, we proposed augmented reality as a tool to guide PSI installation on the patient's bone. We developed an application for smartphones and HoloLens 2 that displays the correct position intraoperatively. We measured placement errors in a conventional phantom and in a realistic phantom that includes a silicone layer to simulate tissue. The results demonstrated a significant reduction of errors with augmented reality compared to freehand placement, ensuring PSI installation close to the target area.

    Finally, we proposed three setups for surgical navigation in palate tumor resections using optical trackers and augmented reality. The tracking tools for the patient and the surgical instruments were fabricated with low-cost desktop 3D printers and designed to provide less invasive setups than previous solutions. All setups presented similar results with high accuracy when tested in a 3D-printed patient-specific phantom. They were then validated in a real surgical case, and one of the solutions was applied for intraoperative guidance. Postoperative results demonstrated high navigation accuracy, yielding optimal surgical outcomes. The proposed solution enabled a conservative surgical approach with a less invasive navigation setup.

    To conclude, in this thesis we have proposed new setups for intraoperative navigation in two complex surgical scenarios for tumor resection. We analyzed their navigation precision, defining the optimal configurations to ensure accuracy. With this, we have demonstrated that computer-assisted surgery techniques can be integrated into the surgical workflow with accessible, non-invasive setups. These results are a step towards optimizing these procedures and continuing to improve surgical outcomes in complex surgical scenarios.

    Programa de Doctorado en Ciencia y Tecnología Biomédica por la Universidad Carlos III de Madrid. President: Raúl San José Estépar. Secretary: Alba González Álvarez. Committee member: Simon Droui
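    The image-to-patient registration step described above pairs artificial landmarks on the PSIs (localized in the preoperative image) with the same landmarks digitized on the patient with a tracked pointer. A standard way to solve this is least-squares rigid registration via SVD (Arun/Kabsch); the sketch below is an illustrative, generic implementation with hypothetical landmark coordinates, not the thesis code.

```python
import numpy as np

def rigid_register(image_pts, patient_pts):
    """Least-squares rigid transform (R, t) mapping image-space landmarks
    onto patient-space landmarks (SVD-based, reflection-safe)."""
    ci, cp = image_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (image_pts - ci).T @ (patient_pts - cp)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = cp - R @ ci
    return R, t

# Hypothetical PSI landmark coordinates in image space (mm) and the same
# landmarks digitized on the patient (here, a known rotation + translation).
image_pts = np.array([[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30.0]])
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1.0]])
patient_pts = image_pts @ R_true.T + np.array([10.0, -5.0, 2.0])

R, t = rigid_register(image_pts, patient_pts)
# Fiducial registration error: mean residual at the landmarks.
fre = np.linalg.norm(image_pts @ R.T + t - patient_pts, axis=1).mean()
print(fre)  # ~0 for noise-free landmarks
```

    In practice the residual at the landmarks (fiducial registration error) is what the cadaveric studies report as image-to-patient registration error; landmark digitization noise makes it nonzero.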

    Integration of a facial recognition system in mobile devices based on OpenCV

    The technological progress of recent decades has completely changed the way we live. Nowadays, electronic devices, and especially mobile phones, are essential in our daily life, largely because of the variety of functionalities they provide; application development has accordingly become one of the most active businesses. These technological advances have also boosted several research fields, including biometrics, in which data extracted from physical characteristics or behavior can be enough to identify a person. Feeding the captured data into a computer allows fast processing and automates the recognition process.

    In this project, both fields, mobile telephony and biometrics, are combined by integrating facial recognition techniques into an Android mobile application. The recognition algorithms are obtained from OpenCV, an open-source library developed for computer vision applications. In addition, the application is encapsulated under the BioAPI standard, created to unify biometric systems and allow software to work with different devices independently of their vendor. The main functionality of the application is to organize, almost automatically, the images stored on a mobile phone or tablet according to the contacts added by the user. To do so, it applies facial recognition techniques, analyzing the stored images and building decision models.

    Ingeniería de Sistemas Audiovisuale

    Evaluation of optical tracking and augmented reality for needle navigation in sacral nerve stimulation

    Background and objective: Sacral nerve stimulation (SNS) is a minimally invasive procedure in which an electrode lead is implanted through the sacral foramina to stimulate the nerves modulating colonic and urinary functions. One of the most crucial steps in SNS procedures is the placement of the tined lead close to the sacral nerve. However, needle insertion is very challenging for surgeons: several x-ray projections are required to interpret the needle position correctly, and in many cases multiple punctures are needed, increasing surgical time as well as patient discomfort and pain. In this work, we propose and evaluate two navigation systems to guide electrode placement in SNS surgeries, designed to reduce surgical time, minimize patient discomfort, and improve surgical outcomes.

    Methods: For the first alternative, we developed open-source navigation software that guides electrode placement by tracking the needle in real time with an optical tracking system (OTS). For the second, we present a smartphone-based AR application that displays virtual guidance elements directly on the affected area, using a 3D-printed reference marker placed on the patient; this guidance facilitates needle insertion along a predefined trajectory. Both techniques were evaluated to determine whether they improve on the current surgical procedure. To compare the proposals with the clinical method, we developed an x-ray software tool that computes a digitally reconstructed radiograph, simulating the fluoroscopy acquisitions performed during the procedure. Twelve physicians (inexperienced and experienced users) performed needle insertions toward several specific targets to evaluate the alternative SNS guidance methods on a realistic patient-based phantom.

    Results: With each navigation solution, users took less time on average to complete each insertion (36.83 s and 44.43 s for the OTS and AR methods, respectively) and needed fewer punctures on average to reach the target (1.23 and 1.96 for the OTS and AR methods, respectively) than with the standard clinical method (189.28 s and 3.65 punctures).

    Conclusions: We have presented two navigation alternatives that could improve surgical outcomes by significantly reducing the number of needle insertions, the surgical time, and the patient's pain in SNS procedures. We believe these solutions are feasible for training surgeons and could even replace the current SNS clinical procedure.

    Research supported by projects PI18/01625 and AC20/00102 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III, Asociación Española Contra el Cáncer and European Regional Development Fund "Una manera de hacer Europa"), IND2018/TIC-9753 (Comunidad de Madrid) and project PerPlanRT (ERA PerMed). Funding for APC: Universidad Carlos III de Madrid (Read & Publish Agreement CRUE-CSIC 2022)
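    The digitally reconstructed radiograph (DRR) used to simulate fluoroscopy is, at its simplest, a line integral of CT attenuation along each ray. The sketch below is a deliberately minimal parallel-beam approximation on a toy volume; real fluoroscopy simulation uses perspective (cone-beam) geometry and proper ray casting.

```python
import numpy as np

# Toy CT volume (depth x height x width) with a dense "bone" block and an
# empty channel standing in for a sacral foramen. Values are attenuation.
ct = np.zeros((40, 64, 64))
ct[15:25, 20:44, 20:44] = 1.0     # dense bone region
ct[18:22, 30:34, 30:34] = 0.0     # hollow "foramen" inside the block

# Parallel-beam DRR: integrate attenuation along the anteroposterior axis.
# Orthographic sum, for illustration only; cone-beam DRRs trace diverging rays.
drr = ct.sum(axis=0)

print(drr.shape)   # (64, 64) projection image
print(drr.max())   # thickest bone path (10 voxels of attenuation 1.0)
```

    Darker regions of the projection (lower integrated attenuation) are where the foramen lies, which is exactly the cue surgeons read on fluoroscopy when aiming the needle.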

    Combining Surgical Navigation and 3D Printing for Less Invasive Pelvic Tumor Resections

    Surgical interventions for musculoskeletal tumor resection are particularly challenging in the pelvic region due to its anatomical complexity and the proximity of vital structures. Several techniques, such as surgical navigation and patient-specific instruments (PSIs), have been introduced to ensure accurate resection margins. However, their inclusion usually modifies the surgical approach, making it more invasive. In this study, we propose combining both techniques to reduce this invasiveness while improving the precision of the intervention: PSIs are used for image-to-patient registration and for installing the navigation reference frame. We tested and validated the proposed setup in a realistic surgical scenario with six cadavers (12 hemipelvises). The data collected during the experiment allowed us to study different resection scenarios, identifying the PSI configurations that optimize navigation accuracy. The mean values obtained for the maximum osteotomy deviation, or MOD (the maximum distance between the planned and actual osteotomy for each simulated scenario), were as follows: for ilium resections, 5.9 mm in the iliac crest and 1.65 mm in the supra-acetabular region; for acetabulum resections, 3.44 mm, 1.88 mm, and 1.97 mm in the supra-acetabular, ischial, and pubic regions, respectively. Additionally, cases with an image-to-patient registration error below 2 mm ensured MODs of 2 mm or lower. Our results show how combining several PSIs leads to low navigation errors and high precision while providing a less invasive surgical approach.

    This work was supported by the Ministerio de Ciencia e Innovación, Instituto de Salud Carlos III, and European Regional Development Fund "Una manera de hacer Europa," under Project PI18/01625.
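    The MOD metric defined above, the maximum distance between the planned and achieved osteotomy, reduces to a maximum point-to-plane distance when the planned cut is modeled as a plane. The following sketch uses hypothetical digitized points on the achieved cut; it illustrates the metric, not the study's actual measurement pipeline.

```python
import numpy as np

def max_osteotomy_deviation(actual_pts, plane_point, plane_normal):
    """MOD: maximum unsigned distance from points digitized on the achieved
    osteotomy surface to the planned cutting plane (point + normal)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return np.abs((actual_pts - plane_point) @ n).max()

# Hypothetical planned plane (z = 0) and achieved-cut points with ~0.5 mm
# scatter around the plane, in mm.
plane_point = np.array([0.0, 0.0, 0.0])
plane_normal = np.array([0.0, 0.0, 1.0])
rng = np.random.default_rng(0)
actual = np.column_stack([rng.uniform(-20, 20, 200),
                          rng.uniform(-20, 20, 200),
                          rng.normal(0.0, 0.5, 200)])

mod = max_osteotomy_deviation(actual, plane_point, plane_normal)
print(round(mod, 2))  # worst-case deviation, mm
```

    Because MOD takes the maximum rather than the mean, a single badly seated PSI dominates the score, which is why placement errors in flat, homogeneous regions such as the ilium show up so strongly in the results.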

    Desktop 3D Printing: Key for Surgical Navigation in Acral Tumors?

    Surgical navigation techniques have shown potential benefits in orthopedic oncologic surgery. However, translating these results to acral tumor resection surgeries is challenging due to the large number of joints with complex movements in the affected areas (located in the distal extremities). This study proposes a surgical workflow that combines intraoperative open-source navigation software, based on multi-camera tracking, with desktop three-dimensional (3D) printing for accurate navigation of these tumors. Desktop 3D printing was used to fabricate patient-specific molds that ensure the distal extremity is in the same position in the preoperative images and during image-guided surgery (IGS). The feasibility of the proposed workflow was evaluated in two clinical cases (soft-tissue sarcomas in the hand and foot). The validation involved deformation analysis of the 3D-printed mold after sterilization, accuracy assessment of the system in patient-specific 3D-printed phantoms, and feasibility of the workflow during the surgical intervention. The sterilization process did not lead to significant deformation of the mold (mean error below 0.20 mm). The overall accuracy of the system, evaluated on the phantoms, was 1.88 mm. IGS guidance was feasible during both surgeries, allowing surgeons to verify a sufficient margin during tumor resection. These results demonstrate the viability of combining open-source navigation and desktop 3D printing for acral tumor surgeries. The suggested framework can be easily personalized to any patient and adapted to other surgical scenarios.

    This work was supported by projects TEC2013-48251-C2-1-R (Ministerio de Economía y Competitividad); PI18/01625 and PI15/02121 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III and European Regional Development Fund "Una manera de hacer Europa") and IND2018/TIC-9753 (Comunidad de Madrid).
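    The mold deformation analysis mentioned above amounts to comparing a post-sterilization surface scan against the designed mold geometry. A common summary is the mean nearest-neighbor distance between the two point clouds; the sketch below uses small hypothetical clouds and a brute-force search (a KD-tree is preferable at realistic scan sizes).

```python
import numpy as np

def mean_surface_error(scanned_pts, design_pts):
    """Mean nearest-neighbor distance (mm) from each scanned point to the
    design point cloud. Brute force O(N*M); fine for small clouds."""
    d = np.linalg.norm(scanned_pts[:, None, :] - design_pts[None, :, :], axis=2)
    return d.min(axis=1).mean()

# Hypothetical design surface vs. a post-sterilization scan perturbed by
# ~0.1 mm of noise (standing in for thermal deformation).
rng = np.random.default_rng(1)
design = rng.uniform(0, 50, size=(300, 3))
scanned = design + rng.normal(0.0, 0.1, size=design.shape)

err = mean_surface_error(scanned, design)
print(round(err, 3))  # mean deviation, mm
```

    A mean deviation well under the navigation system's own accuracy (1.88 mm here) is what justifies treating the sterilized mold as geometrically identical to the planned one.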

    Augmented reality as a tool to guide PSI placement in pelvic tumor resections

    Patient-specific instruments (PSIs) have become a valuable tool for osteotomy guidance in complex surgical scenarios such as pelvic tumor resection. They provide accuracy similar to that of surgical navigation systems but are generally more convenient and faster. However, their correct placement can be challenging in some anatomical regions, and it cannot be verified objectively during the intervention. Incorrect installation can result in large deviations from the planned osteotomy, increasing the risk of positive resection margins. In this work, we propose using augmented reality (AR) to guide and verify PSI placement. We designed an experiment to assess the accuracy provided by the system using a smartphone and the HoloLens 2, and compared the results with the conventional freehand method. The results showed significant differences: AR guidance prevented large osteotomy deviations, reducing a maximum deviation of 54.03 mm with freehand placement to less than 5 mm with AR guidance. The experiment was performed on two versions of a plastic three-dimensional (3D) printed phantom, one including a silicone layer to simulate tissue and provide more realism. We also studied how differences in the shape and location of PSIs affect their accuracy, concluding that those with smaller sizes and a homogeneous target surface are more prone to errors. Our study presents promising results that prove AR's potential to overcome the present limitations of PSIs conveniently and effectively.

    This research was funded by project PI18/01625 (Ministerio de Ciencia e Innovación, Instituto de Salud Carlos III and European Regional Development Fund "Una manera de hacer Europa")

    Surgical navigation, augmented reality, and 3D printing for hard palate adenoid cystic carcinoma en-bloc resection: case report and literature review

    Adenoid cystic carcinoma is a rare and aggressive tumor representing less than 1% of head and neck cancers. This malignancy often arises from the minor salivary glands, the palate being its most common location. Surgical en-bloc resection with clear margins is the primary treatment. However, this location presents a limited line of sight and a high risk of injury, making the surgical procedure challenging. In this context, technologies such as intraoperative navigation can become an effective tool, reducing morbidity and improving the safety and accuracy of the procedure. Although their use is widespread in fields such as neurosurgery, their application in maxillofacial surgery has not been widely evidenced. One reason is the need to rigidly fixate a navigation reference to the patient, which often entails an invasive setup. In this work, we studied three alternative, less invasive setups using optical tracking, 3D printing, and augmented reality. We evaluated their precision in a patient-specific phantom, obtaining errors below 1 mm. The optimal setup was then applied in a clinical case, where the navigation software was used to guide the tumor resection. Points were collected along the surgical margins after resection and compared with the corresponding points identified in the postoperative CT. Distances of less than 2 mm were obtained in 90% of the samples. Moreover, the navigation provided confidence to the surgeons, who could then undertake a less invasive and more conservative approach. The postoperative CT scans showed adequate resection margins and confirmed that the patient remains free of disease after two years of follow-up.

    This work has been supported by projects PI18/01625 (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III and European Regional Development Fund "Una manera de hacer Europa") and IND2018/TIC-9753 (Comunidad de Madrid)
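    The postoperative validation described above, comparing intraoperatively collected margin points with their counterparts in the postoperative CT, reduces to computing per-point distances and the fraction under a clinical threshold. A minimal sketch with illustrative (not the study's) distance values:

```python
import numpy as np

# Hypothetical distances (mm) between margin points collected with the
# navigation pointer and the corresponding points in the postoperative CT.
distances = np.array([0.4, 0.8, 1.1, 1.3, 0.6, 1.9, 0.9, 2.4, 1.5, 0.7])

within_2mm = np.mean(distances < 2.0) * 100   # percentage under threshold
print(f"{within_2mm:.0f}% of margin samples within 2 mm")
```

    Reporting the fraction under a threshold, rather than only the mean, captures whether any part of the margin deviated enough to matter clinically.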

    Craniosynostosis surgery: workflow based on virtual surgical planning, intraoperative navigation and 3D printed patient-specific guides and templates

    Craniosynostosis must often be corrected surgically by remodeling the affected bone tissue. Nowadays, surgical reconstruction relies mostly on the surgeon's subjective judgement to best restore normal skull shape, since the remodeled bone is placed and fixed manually, and slight variations can compromise the cosmetic outcome. The objective of this study was to describe and evaluate a novel workflow for patient-specific correction of craniosynostosis based on intraoperative navigation and 3D printing. The workflow was followed in five patients with craniosynostosis. Virtual surgical planning was performed, and patient-specific cutting guides and templates were designed and manufactured. These guides and templates were used to control the osteotomies and bone remodeling. An intraoperative navigation system based on optical tracking made it possible to follow the preoperative virtual plan in the operating room through real-time positioning and 3D visualization. Navigation accuracy was estimated using intraoperative surface scanning as the gold standard. Average errors of 0.62 mm and 0.64 mm were obtained in the remodeled frontal region and the supraorbital bar, respectively. Intraoperative navigation is an accurate and reproducible technique for the correction of craniosynostosis that enables optimal translation of the preoperative plan to the operating room. © 2019, The Author(s).

    This work has been supported by Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III, project "PI18/01625", co-funded by European Regional Development Fund (ERDF), "A way of making Europe"

    HoloLens 1 vs. HoloLens 2: Improvements in the New Model for Orthopedic Oncological Interventions

    This work analyzed the use of the Microsoft HoloLens 2 in orthopedic oncological surgeries and compared it with its predecessor, the Microsoft HoloLens 1. Specifically, we developed two equivalent applications, one for each device, and evaluated the augmented reality (AR) projection accuracy in an experimental scenario using phantoms based on two patients. We achieved automatic registration between the virtual and real worlds using patient-specific surgical guides on each phantom. The guides contained a small adaptor for a 3D-printed AR marker whose characteristic patterns were easily recognized by both devices. The newer model improved the AR projection accuracy by almost 25%, and both yielded an RMSE below 3 mm. After ascertaining the improvement of the second model in this respect, we went a step further with the HoloLens 2 and tested it during the surgical intervention of one of the patients. During this experience, we collected the surgeons' feedback on comfort, usability, and ergonomics. Our goal was to estimate whether the improved technical features of the newer model facilitate its implementation in actual surgical scenarios. All the results indicate that the Microsoft HoloLens 2 is better in all the aspects affecting surgical interventions and support its use in future experiences.

    This work was supported by projects PI18/01625, AC20/00102-3 and ERA PerMed PerPlanRT (Ministerio de Ciencia, Innovación y Universidades, Instituto de Salud Carlos III, Asociación Española Contra el Cáncer and European Regional Development Fund "Una manera de hacer Europa") and IND2018/TIC-9753 (Comunidad de Madrid)
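    The AR projection accuracy reported above is typically summarized as the root-mean-square error (RMSE) between virtually projected landmarks and their physical ground-truth positions. The sketch below computes it on hypothetical landmark sets for the two devices; the numbers are illustrative, not the study's measurements.

```python
import numpy as np

def rmse(projected, reference):
    """RMSE (mm) between AR-projected landmark positions and their
    ground-truth positions."""
    return np.sqrt(np.mean(np.sum((projected - reference) ** 2, axis=1)))

# Hypothetical ground-truth landmarks (mm) and projections from each device.
reference = np.array([[0, 0, 0], [30, 0, 0], [0, 30, 0], [30, 30, 0.0]])
hl1 = reference + np.array([[2.0, 1.0, 1.5], [1.5, -2.0, 1.0],
                            [-1.0, 2.0, 1.5], [2.0, 1.0, -1.0]])
hl2 = reference + np.array([[1.5, 0.5, 1.0], [1.0, -1.5, 0.5],
                            [-0.5, 1.5, 1.0], [1.5, 0.5, -0.5]])

print(round(rmse(hl1, reference), 2), round(rmse(hl2, reference), 2))
```

    In this toy example both devices stay under 3 mm and the second set has the lower RMSE, mirroring the qualitative pattern the study reports.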

    Breast Tumor Localization by Prone to Supine Landmark Driven Registration for Surgical Planning

    Breast cancer is the most common cancer in women worldwide. Screening programs and imaging improvements have increased the detection of clinically occult, non-palpable lesions requiring preoperative localization. Wire-guided localization (WGL) is the current standard of care for the excision of non-palpable carcinomas during breast-conserving surgery. Given the current limitations of intraoperative tumor localization approaches, integrating multimodal imaging information may be especially relevant for surgical planning. This research proposes a novel method for aligning preoperative images to surgical surface data in order to determine the position of the tumor at the time of surgery and aid preoperative planning. First, the volume of the breast in the surgical position is reconstructed and a set of surface correspondences is defined. Then, the preoperative (prone) and intraoperative (supine) volumes are co-registered using landmark-driven non-rigid registration methods; we compared the performance of diffeomorphic and B-spline-based registration. Finally, the method was validated using clinical data from 67 patients, taking as the target registration error (TRE) the distance between the estimated tumor position and the reference surgical position. The proposed method achieved a TRE of 16.21 ± 8.18 mm and could potentially assist the planning and guidance of breast cancer treatment in clinical practice.

    This work was supported in part by the Spanish Ministry of Science and Innovation under Project RTI2018-098682-B-I00 (MCIU/AEI/FEDER, UE), Project PI18/01625 (Instituto de Salud Carlos III) and Grant BGP18/00178 under the Beatriz Galindo Programme; in part by the European Union's European Regional Development Fund (ERDF); and in part by the Madrid Government (Comunidad de Madrid-Spain) under the Multiannual Agreement with Universidad Politécnica de Madrid in the line Support for Research and Development Projects for Beatriz Galindo researchers, in the context of the V Plan Regional de Investigación Científica e Innovación Tecnológica (PRICIT)
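    The validation pipeline above, fitting a landmark-driven transform from prone to supine, mapping the tumor through it, and measuring the TRE against the reference position, can be sketched with a simple affine least-squares fit standing in for the diffeomorphic and B-spline methods the study actually compares. All landmark data below are hypothetical.

```python
import numpy as np

# Hypothetical prone (preoperative) surface landmarks, deformed by a known
# toy affine map into their supine (intraoperative) positions.
rng = np.random.default_rng(2)
prone_lm = rng.uniform(0, 100, size=(12, 3))
A_true = np.array([[1.0, 0.1, 0.0],
                   [0.05, 0.9, 0.0],
                   [0.0, 0.0, 1.1]])
t_true = np.array([5.0, -10.0, 2.0])
supine_lm = prone_lm @ A_true.T + t_true

# Landmark-driven registration: least-squares affine fit in homogeneous
# coordinates (stand-in for the non-rigid methods of the paper).
X = np.hstack([prone_lm, np.ones((len(prone_lm), 1))])
M, *_ = np.linalg.lstsq(X, supine_lm, rcond=None)        # 4x3 affine map

# Map the prone tumor position and measure TRE against the reference.
prone_tumor = np.array([50.0, 40.0, 30.0])
est_supine_tumor = np.append(prone_tumor, 1.0) @ M
true_supine_tumor = prone_tumor @ A_true.T + t_true
tre = np.linalg.norm(est_supine_tumor - true_supine_tumor)
print(tre)  # target registration error (mm); ~0 for this exactly-affine toy
```

    With real breasts the prone-to-supine deformation is far from affine, which is why the study resorts to non-rigid models and still reports a TRE on the order of centimeters.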